The larger the size of the data, structured or unstructured, the harder it is to understand and make use of it. Feature selection is one of the fundamentals of machine learning: by reducing the number of irrelevant or redundant features, it dramatically reduces the run time of a learning algorithm and leads to a more general concept. In this paper, we investigate feature selection through a neural-network-based algorithm aided by a topology-optimizing genetic algorithm. We utilize NeuroEvolution of Augmenting Topologies (NEAT) to select a subset of features with the most relevant connection to the target concept. Discovery and improvement of solutions are two main goals of machine learning; however, their accuracy depends on the dimensionality of the problem space. Although feature selection methods can help improve this accuracy, the complexity of the problem can also affect their performance. Artificial neural networks have proven effective at feature elimination, but because most neural networks have a fixed topology, they lose accuracy when the problem contains a considerable number of local minima. To minimize this drawback, the topology of the neural network should be flexible and able to avoid local minima, especially when a feature is removed. In this work, the power of feature selection through the NEAT method is demonstrated. Compared to the evolution of networks with a fixed structure, NEAT discovers significantly more sophisticated strategies. The results show that NEAT provides better accuracy than a conventional Multi-Layer Perceptron and leads to improved feature selection.
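To make the idea concrete, the following is a minimal sketch, not the authors' implementation, of how NEAT can double as a feature selector: the evolved genome only adds connections from input nodes that improve fitness, so inputs left without enabled connections can be read as pruned features. It assumes the neat-python library, a standard scikit-learn dataset, and a hypothetical "neat-config" file whose num_inputs matches the number of features.

import neat
import numpy as np
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize features

def eval_genomes(genomes, config):
    # Fitness = training accuracy of the evolved feed-forward network.
    for genome_id, genome in genomes:
        net = neat.nn.FeedForwardNetwork.create(genome, config)
        preds = [net.activate(row)[0] > 0.5 for row in X]
        genome.fitness = float(np.mean(preds == y))

# "neat-config" is an assumed config file: num_inputs = X.shape[1], num_outputs = 1,
# and a sparse initial connection setting so links from inputs are added only
# when they help, which is what makes the result interpretable as feature selection.
config = neat.Config(neat.DefaultGenome, neat.DefaultReproduction,
                     neat.DefaultSpeciesSet, neat.DefaultStagnation,
                     "neat-config")
winner = neat.Population(config).run(eval_genomes, 50)

# Input nodes with at least one enabled outgoing connection form the selected subset.
input_keys = set(config.genome_config.input_keys)
selected = {i for (i, o), conn in winner.connections.items()
            if conn.enabled and i in input_keys}
print("selected input nodes:", sorted(selected))

In practice one would evaluate fitness on a held-out split and penalize network size, but this sketch captures the mechanism the abstract describes: the topology, including which features are wired in at all, is itself the object being evolved.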